Study Finds YouTube's System Sends Gun Videos to 9-year-olds
2023-05-21
A new study has found that YouTube's tool for suggesting videos can direct young users to content about guns and violence.
The study was based on an experiment carried out by the Tech Transparency Project.
The nonprofit group studies social media services.
Researchers from the group set up two YouTube accounts that simulated online activity that might be interesting to 9-year-old boys.
The two accounts contained exactly the same information.
The only difference was that one account chose to watch only the videos suggested by YouTube.
The other ignored the video service's suggested offerings.
The organization found that the account that chose to watch YouTube's suggestions was flooded with graphic videos.
These included videos about school shootings and instructions for making guns fully automatic.
Many of the suggested videos violate YouTube's own policies against violent or graphic content.
YouTube has technology tools that are meant to restrict some kinds of videos.
But the study suggests that those tools are failing to block violent content from young users.
The researchers involved in the study said the tools may even be sending children to videos that include extremist and violent material.
Katie Paul leads the Tech Transparency Project.
She said, "Video games are one of the most popular activities for kids. You can play a game like 'Call of Duty' without ending up at a gun shop - but YouTube is taking them there."
Paul added, "It's not the video games, it's not the kids. It's the algorithms."
An algorithm is a set of steps that are followed to complete a computing process or solve a problem.
Social media companies use algorithms to predict what content users might be interested in based on their past watch history.
The algorithms then suggest that content to users.
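The basic idea can be shown with a short, hypothetical sketch in Python. It only illustrates suggesting content based on past watch history, as described above; the suggest_videos function, the tags and the video titles are invented for this example and are not YouTube's actual system.

```python
from collections import Counter

def suggest_videos(watch_history, candidates, top_n=2):
    """Suggest the candidate videos whose topics best match past viewing."""
    # Count how often each topic tag appears in the viewer's watch history.
    interest = Counter(tag for video in watch_history for tag in video["tags"])
    # Score each candidate video by how strongly its tags match those interests.
    scored = [
        (sum(interest[tag] for tag in video["tags"]), video["title"])
        for video in candidates
    ]
    # Suggest the highest-scoring videos first.
    return [title for _, title in sorted(scored, reverse=True)[:top_n]]

history = [
    {"title": "Call of Duty gameplay", "tags": ["gaming", "shooter"]},
    {"title": "Best shooter game moments", "tags": ["gaming", "shooter"]},
]
candidates = [
    {"title": "Minecraft building tips", "tags": ["gaming", "sandbox"]},
    {"title": "How real firearms work", "tags": ["shooter", "guns"]},
    {"title": "Easy recipes for kids", "tags": ["cooking"]},
]
print(suggest_videos(history, candidates))
```

In a sketch like this, a child who watches shooter-game videos can be steered toward real-gun videos simply because the topics overlap. That is the kind of drift the study describes.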
The account that clicked on YouTube's suggested videos received 382 different gun-related videos in a single month.
The account that ignored YouTube's suggestions still received some gun-related videos, but only 34 in total.
A spokeswoman for YouTube defended the platform's protections for children and noted that the service requires users under age 17 to get a parent's permission before using its website.
YouTube says accounts for users younger than 13 are linked to a parental account.
The company noted that it offers several choices for younger viewers that are "designed to create a safer experience for tweens and teens."
Activist groups for children have long criticized YouTube for making violent and troubling content easily available to young users.
They say YouTube sometimes suggests videos that promote gun violence, eating disorders and self-harm.
YouTube has already removed some of the videos that the Tech Transparency Project identified.
But others remain available.
Many technology companies depend on computer programs to identify and remove content that violates their rules.
But Paul said findings from her organization's study show that greater investment and effort are needed to block such material.
Justin Wagner is the director of investigations at Everytown for Gun Safety, a gun control activist group.
He told the AP that without federal legislation, social media companies must do more to enforce their own rules.
He added, "Children who aren't old enough to buy a gun shouldn't be able to turn to YouTube to learn how to build a firearm, modify it to make it deadlier, or commit atrocities."
I'm Bryan Lynn.
Bryan Lynn wrote this story for VOA Learning English, based on reports from The Associated Press.
_______________________________________________________________

Words in This Story

simulate - v. to do or make something that behaves or looks like something real but is not

graphic - adj. extremely clear and detailed

promote - v. to urge people to like, buy or use something

modify - v. to change something in order to improve it

atrocity - n. an extremely violent and shocking attack